Search for: All records

Creators/Authors contains: "Chen, Kai-Feng"


  1. Abstract Measuring one-point statistics in redshifted 21 cm intensity maps offers an opportunity to explore non-Gaussian features of the early Universe. We assess the impact of instrumental effects on measurements made with the Hydrogen Epoch of Reionization Array (HERA) by forward modeling observational and simulation data. Using HERA Phase I observations over 94 nights, we examine the second (m2, variance) and third (m3) moments of images. We employ the DAYENU-filtering method for foreground removal and reduce simulated foreground residuals to 10% of the 21 cm signal residuals. In noiseless cosmological simulations, the amplitudes of one-point statistics measurements are significantly reduced by the instrument response and further reduced by wedge-filtering. Analyses with wedge-filtered observational data, along with expected noise simulations, show that systematics alter the probability distribution of the map pixels. A likelihood analysis based on the observational data shows that m2 measurements disfavor the cold reionization model characterized by inefficient X-ray heating, in line with other power spectrum measurements. Small signals in m3, due to the instrument response of the Phase I observation and wedge-filtering, make it challenging to use these non-Gaussian statistics to explore model parameters. Forecasts with the full HERA array predict high signal-to-noise ratios for m2, m3, and S3 assuming no foregrounds, but wedge-filtering drastically reduces these ratios. This work demonstrates conclusively that a comprehensive understanding of instrumental effects on m2 and m3 is essential for their use as a cosmological probe, given their dependence on the underlying model.
    Free, publicly-accessible full text available November 3, 2026
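The one-point statistics named in the abstract can be sketched in a few lines. This is a minimal illustration, not the HERA analysis pipeline: it assumes map pixels in a NumPy array and the common large-scale-structure normalization S3 = m3 / m2²; the function name `one_point_stats` is hypothetical.

```python
import numpy as np

def one_point_stats(pixels):
    """Mean-subtracted one-point statistics of a pixel array.

    Returns the variance m2, the third moment m3, and the
    normalized third moment S3 = m3 / m2**2.
    """
    delta = pixels - pixels.mean()   # fluctuation about the map mean
    m2 = np.mean(delta**2)           # second moment (variance)
    m3 = np.mean(delta**3)           # third moment (skew-sensitive)
    s3 = m3 / m2**2                  # normalized third moment
    return m2, m3, s3
```

For a Gaussian field m3 (and hence S3) vanishes in expectation, which is why these moments probe non-Gaussian features of the signal.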
  2. Abstract This paper presents the design and deployment of the Hydrogen Epoch of Reionization Array (HERA) phase II system. HERA is designed as a staged experiment targeting 21 cm emission measurements of the Epoch of Reionization. First results from the phase I array are published as of early 2022, and deployment of the phase II system is nearing completion. We describe the design of the phase II system and discuss progress on commissioning and future upgrades. As HERA is a designated Square Kilometre Array pathfinder instrument, we also show a number of “case studies” that investigate systematics seen while commissioning the phase II system, which may be of use in the design and operation of future arrays. Common pathologies are likely to manifest in similar ways across instruments, and many of these sources of contamination can be mitigated once the source is identified. 
  3. Abstract Many measurements at the LHC require efficient identification of heavy-flavour jets, i.e. jets originating from bottom (b) or charm (c) quarks. An overview of the algorithms used to identify c jets is given, and a novel method to calibrate them is presented. This new method adjusts the entire distributions of the outputs obtained when the algorithms are applied to jets of different flavours. It is based on an iterative approach exploiting three distinct control regions that are enriched with either b jets, c jets, or light-flavour and gluon jets. Results are presented in the form of correction factors evaluated using proton-proton collision data with an integrated luminosity of 41.5 fb⁻¹ at √s = 13 TeV, collected by the CMS experiment in 2017. The closure of the method is tested by applying the measured correction factors to simulated data sets and checking the agreement between the adjusted simulation and collision data. Furthermore, a validation is performed by testing the method on pseudodata, which emulate various mismodelling conditions. The calibrated results enable the use of the full distributions of heavy-flavour identification algorithm outputs, e.g. as inputs to machine-learning models. Thus, they are expected to increase the sensitivity of future physics analyses.
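The core idea of calibrating a full tagger-output distribution, rather than a single working point, can be illustrated with per-bin data/simulation correction factors. This is a highly simplified sketch under stated assumptions: the real method iterates over three flavour-enriched control regions, which is omitted here, and the function names `correction_factors` and `apply_correction` are hypothetical.

```python
import numpy as np

def correction_factors(data_vals, sim_vals, sim_weights, bins):
    """Per-bin data/simulation ratio of the tagger-output histograms."""
    data_hist, _ = np.histogram(data_vals, bins=bins)
    sim_hist, _ = np.histogram(sim_vals, bins=bins, weights=sim_weights)
    # Guard against empty simulation bins: fall back to a factor of 1.
    return np.divide(data_hist, sim_hist,
                     out=np.ones(len(data_hist)),
                     where=sim_hist > 0)

def apply_correction(sim_vals, sim_weights, factors, bins):
    """Reweight simulated events by the correction factor of their bin."""
    idx = np.clip(np.digitize(sim_vals, bins) - 1, 0, len(factors) - 1)
    return sim_weights * factors[idx]
```

Applying the factors as event weights preserves the shape information of the tagger output, which is what allows the calibrated distributions to be fed directly into machine-learning models downstream.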